A metric entropy bound is not sufficient for learnability

Authors

  • R. M. Dudley
  • Sanjeev R. Kulkarni
  • T. J. Richardson
  • Ofer Zeitouni

Similar references

Learning with prior information

In this paper, a new notion of learnability is introduced, referred to as learnability with prior information (w.p.i.). This notion is weaker than the standard notion of probably approximately correct (PAC) learnability, which has been studied extensively in recent years. A property called “dispersability” is introduced, and it is shown that dispersability plays a key role in the study of learnabil...
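
For orientation, the standard PAC baseline that the w.p.i. notion relaxes can be stated as follows; this is the usual Valiant-style definition, not the paper's own formulation of w.p.i. learnability:

    A concept class $\mathcal{C}$ over a domain $X$ is PAC learnable if there exist a
    learning algorithm $A$ and a sample size $m(\epsilon,\delta)$ such that for every
    target $c \in \mathcal{C}$, every distribution $D$ on $X$, and all
    $\epsilon,\delta \in (0,1)$: given $m(\epsilon,\delta)$ i.i.d. examples
    $(x_i, c(x_i))$ with $x_i \sim D$, the output hypothesis $h$ satisfies
    \[
        \Pr\bigl[\, D(h \,\triangle\, c) \le \epsilon \,\bigr] \ge 1 - \delta,
    \]
    where $h \,\triangle\, c$ denotes the symmetric difference (the error region).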

On Metric Entropy, Vapnik-Chervonenkis Dimension, and Learnability for a Class of Distributions (LIDS-P-1910, September 1989)

In [23], Valiant proposed a formal framework for distribution-free concept learning which has generated a great deal of interest. A fundamental result regarding this framework was proved by Blumer et al. [6] characterizing those concept classes which are learnable in terms of their Vapnik-Chervonenkis (VC) dimension. More recently, Benedek and Itai [4] studied learnability with respect to a fix...
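
The Blumer et al. characterization mentioned here is a standard result and, in summary form (not quoted from the preprint itself), says:

    \[
        \mathcal{C} \text{ is distribution-free PAC learnable}
        \iff d := \mathrm{VCdim}(\mathcal{C}) < \infty,
    \]
    and when $d < \infty$, any consistent learner succeeds from a sample of size
    \[
        m(\epsilon,\delta) = O\!\left(\frac{1}{\epsilon}\Bigl(d \log \frac{1}{\epsilon}
            + \log \frac{1}{\delta}\Bigr)\right).
    \]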

Mind Change Complexity of Learning Logic Programs

The present paper motivates the study of mind change complexity for learning minimal models of length-bounded logic programs. It establishes ordinal mind change complexity bounds for learnability of these classes both from positive facts and from positive and negative facts. Building on Angluin’s notion of finite thickness and Wright’s work on finite elasticity, Shinohara defined the property o...

The Rate of Rényi Entropy for Irreducible Markov Chains

In this paper, we obtain the Rényi entropy rate for irreducible-aperiodic Markov chains with countable state space, using the theory of countable nonnegative matrices. We also obtain a bound on the Rényi entropy rate of an irreducible Markov chain. Finally, we show that this bound on the Rényi entropy rate is the Shannon entropy rate.
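
For reference, the quantities involved are standard (these definitions are not taken from the paper itself): the Rényi entropy of order $\alpha$ recovers Shannon entropy as $\alpha \to 1$,

    \[
        H_\alpha(p) = \frac{1}{1-\alpha}\log \sum_i p_i^{\alpha},
        \qquad
        \lim_{\alpha \to 1} H_\alpha(p) = -\sum_i p_i \log p_i,
    \]
    and for an irreducible Markov chain with transition matrix $P$ and stationary
    distribution $\pi$, the Shannon entropy rate is
    \[
        H = -\sum_i \pi_i \sum_j P_{ij} \log P_{ij}.
    \]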

The Entropy of Words - Learnability and Expressivity across More than 1000 Languages

The choice associated with words is a fundamental property of natural languages. It lies at the heart of quantitative linguistics, computational linguistics, and the language sciences more generally. Information theory gives us the tools to measure precisely the average amount of choice associated with words – the word entropy. Here we use three parallel corpora – encompassing ca. 450 million w...
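
As a minimal sketch of how unigram word entropy is typically estimated, here is a plug-in (maximum-likelihood) estimator in Python; the paper's corpora, tokenization, and estimator (e.g. smoothed or coverage-adjusted variants) may well differ:

    import math
    from collections import Counter

    def word_entropy(tokens):
        """Plug-in (maximum-likelihood) estimate of unigram word entropy, in bits."""
        counts = Counter(tokens)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # Toy usage on a whitespace-tokenized string; real corpora need proper tokenization.
    tokens = "the cat sat on the mat the cat".split()
    print(f"word entropy: {word_entropy(tokens):.3f} bits")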


Journal:
  • IEEE Trans. Information Theory

Volume 40, Issue –

Pages –

Year of publication: 1994